Text prediction recurrent neural networks using long short-term memory-dropout
Authors
Abstract
<span lang="EN-US">Long short-term memory (LSTM) is a type of recurrent neural network (RNN) whose sequence-based models are used in text generation and prediction tasks, question answering, and classification systems, owing to their ability to learn long-term dependencies. The present research integrates LSTM with the dropout technique to generate text from a corpus given as input; the model was developed to find the best way to extract words in context. The model was trained on the novel "</span><em><span lang="EN-US">La Ciudad y los perros</span></em><span lang="EN-US">", which comprises 128,600 input data points. The data was divided into two sets: 38.88% for training and the remaining 61.12% for testing the model. The proposed model was tested with word-importance variants, and the results were evaluated in terms of the semantic proximity of the generated text to the given context.</span>
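The abstract's core mechanism (an LSTM whose hidden state is regularized with dropout) can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the dimensions, weights, dropout rate, and toy input sequence below are all illustrative assumptions, and only the forward pass of a single LSTM cell is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. Stacked gate order: input, forget, candidate, output."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b           # all four gates in one affine map, shape (4H,)
    i = sigmoid(z[0:H])                  # input gate
    f = sigmoid(z[H:2 * H])              # forget gate
    g = np.tanh(z[2 * H:3 * H])          # cell candidate
    o = sigmoid(z[3 * H:4 * H])          # output gate
    c = f * c_prev + i * g               # new cell state
    h = o * np.tanh(c)                   # new hidden state
    return h, c

def dropout(h, rate, training=True):
    """Inverted dropout: zero a random fraction of units, rescale the survivors."""
    if not training or rate == 0.0:
        return h
    mask = (rng.random(h.shape) >= rate) / (1.0 - rate)
    return h * mask

# Toy dimensions (hypothetical; the paper does not report its layer sizes here).
D, H, rate = 8, 16, 0.5
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)

h, c = np.zeros(H), np.zeros(H)
for t in range(5):                       # walk a 5-step toy sequence
    x = rng.normal(size=D)               # stand-in for an embedded word
    h, c = lstm_step(x, h, c, W, U, b)
    h = dropout(h, rate, training=True)  # dropout applied to the hidden state
```

At prediction time the same step would be run with `training=False`, so no units are dropped and no rescaling is applied.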
Similar resources
Efficient Short-Term Electricity Load Forecasting Using Recurrent Neural Networks
Short term load forecasting (STLF) plays an important role in the economic and reliable operation of power systems. Electric load demand has a complex profile with many multivariable and nonlinear dependencies. In this study, a recurrent neural network (RNN) architecture is presented for STLF. The proposed model is capable of forecasting the next 24-hour load profile. The main feature in this network is ...
Simplified Long Short-term Memory Recurrent Neural Networks: part I
We present five variants of the standard Long Short-term Memory (LSTM) recurrent neural networks by uniformly reducing blocks of adaptive parameters in the gating mechanisms. For simplicity, we refer to these models as LSTM1, LSTM2, LSTM3, LSTM4, and LSTM5, respectively. Such parameter-reduced variants enable speeding up data training computations and would be more suitable for implementations o...
Simplified Long Short-term Memory Recurrent Neural Networks: part III
This is part III of a three-part work. In parts I and II, we have presented eight variants of simplified Long Short-Term Memory (LSTM) recurrent neural networks (RNNs). It is noted that fast computation, especially on constrained computing resources, is an important factor in processing big time-sequence data. In this part III paper, we present and evaluate two new LSTM model variants which drama...
Simplified Long Short-term Memory Recurrent Neural Networks: part II
This is part II of a three-part work. Here, we present a second set of inter-related five variants of simplified Long Short-term Memory (LSTM) recurrent neural networks obtained by further reducing adaptive parameters. Two of these models have been introduced in part I of this work. We evaluate and verify our model variants on the benchmark MNIST dataset and assert that these models are comparable to the ...
Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks
Long Short Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vector and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end, LSTM RNN system running on limited computational resources...
Journal
Journal title: Indonesian Journal of Electrical Engineering and Computer Science
Year: 2023
ISSN: 2502-4752, 2502-4760
DOI: https://doi.org/10.11591/ijeecs.v29.i3.pp1758-1768